Search results for "Local optimum"
Showing 10 of 11 documents
Automatic program for peak detection and deconvolution of multi-overlapped chromatographic signals
2005
Several interlinked algorithms for peak deconvolution by non-linear regression are presented. These procedures, together with the peak detection methods outlined in Part I, have allowed the implementation of an automatic method able to process multi-overlapped signals, requiring little user interaction. A criterion based on the evaluation of the multivariate selectivity of the chromatographic signal is used to auto-select the most efficient deconvolution procedure for each chromatographic situation. In this way, non-optimal local solutions are avoided in cases of high overlap, and short computation times are obtained in situations of high resolution. A new algorithm, fitting both the origin…
On the Extension of the DIRECT Algorithm to Multiple Objectives
2020
Deterministic global optimization algorithms like Piyavskii–Shubert, DIRECT, EGO and many more have a recognized standing for problems with many local optima. Although many single-objective optimization algorithms have been extended to multiple objectives, completely deterministic algorithms for nonlinear problems with guarantees of convergence to global Pareto optimality are still missing. For instance, deterministic algorithms usually make use of some form of scalarization, which may lead to incomplete representations of the Pareto-optimal set. Thus, not all global Pareto optima may be obtained, especially in nonconvex cases. On the other hand, algorithms attempting to produce r…
Bayesian Unification of Gradient and Bandit-based Learning for Accelerated Global Optimisation
2017
Bandit-based optimisation has a remarkable advantage over gradient-based approaches: its global perspective eliminates the danger of getting stuck at local optima. However, for continuous optimisation problems or problems with a large number of actions, bandit-based approaches can be hindered by slow learning. Gradient-based approaches, on the other hand, navigate quickly through high-dimensional continuous spaces via local optimisation, following the gradient in fine-grained steps. Yet, apart from being susceptible to local optima, these schemes are less suited to online learning due to their reliance on extensive trial-and-error before the optimum can be identified. In this…
Predicting Heuristic Search Performance with PageRank Centrality in Local Optima Networks
2015
Previous studies have used statistical properties of fitness landscapes, such as ruggedness and deceptiveness, to predict the expected quality of heuristic search methods. Novel approaches for predicting the performance of heuristic search are based on the analysis of local optima networks (LONs). A LON is a compressed stochastic model of a fitness landscape's basin transitions. Recent literature has suggested using various LON network measurements as predictors of local search performance. In this study, we suggest PageRank centrality as a new measure for predicting the performance of heuristic search methods using local search. PageRank centrality is a variant of Eigenvector centrali…
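PageRank itself is straightforward to compute by power iteration. As a rough illustration (not the paper's implementation), here is a minimal sketch on a hypothetical three-optimum LON, where the edge weights stand in for sampled basin-transition frequencies:

```python
def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank on a weighted digraph {node: {neighbour: weight}}.

    A minimal sketch: in a real LON study the edge weights would come
    from estimated basin-transition probabilities, not hand-picked values.
    """
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - damping) / n for v in nodes}  # teleportation mass
        for u, out in adj.items():
            total = sum(out.values())
            if total == 0:
                # Dangling node: spread its rank uniformly.
                for v in nodes:
                    new[v] += damping * rank[u] / n
            else:
                for v, w in out.items():
                    new[v] += damping * rank[u] * (w / total)
        rank = new
    return rank

# Hypothetical 3-optimum LON: most transitions funnel into optimum "c",
# so "c" should receive the highest centrality.
lon = {"a": {"b": 1, "c": 3}, "b": {"c": 4}, "c": {"c": 1}}
ranks = pagerank(lon)
```

The funnel structure shows up directly in the scores: the sink optimum `c` dominates, which is the kind of signal the paper proposes as a predictor of local search performance.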
Coarse-Grained Barrier Trees of Fitness Landscapes
2016
Recent literature suggests that local optima in fitness landscapes are clustered, which offers an explanation of why perturbation-based metaheuristics often fail to find the global optimum: they become trapped in a sub-optimal cluster. We introduce a method to extract and visualize the global organization of these clusters in the form of a barrier tree. Barrier trees have been used to visualize the barriers between local optima basins in fitness landscapes. Our method computes a more coarsely grained tree to reveal the barriers between clusters of local optima. The core element is a new variant of the flooding algorithm, applicable to local optima networks, a compressed representation of fitnes…
Shaping communities of local optima by perturbation strength
2017
Recent work discovered that fitness landscapes induced by Iterated Local Search (ILS) may consist of multiple clusters, denoted as funnels or communities of local optima. Such studies exist only for perturbation operators (kicks) with low strength. We examine how different strengths of the ILS perturbation operator affect the number and size of clusters. We present an empirical study based on local optima networks from NK fitness landscapes. Our results show that a properly selected perturbation strength can help overcome the effect of ILS getting trapped in clusters of local optima. This has implications for designing effective ILS approaches in practice, where traditionally only small per…
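To make the role of perturbation strength concrete, here is a purely illustrative ILS sketch on a toy integer landscape (not the NK setting used in the study): a weak kick can never leave the starting basin, while a stronger kick lets the search hop between basins of local optima.

```python
import random

def f(x):
    # Toy multimodal objective on 0..100: local optima at multiples of 7,
    # global optimum at x = 42 (purely illustrative, not from the paper).
    return (x % 7) + abs(x - 42) / 100.0

def local_search(x):
    # First-improvement hill climbing over +/-1 neighbours on 0..100.
    while True:
        better = [n for n in (x - 1, x + 1) if 0 <= n <= 100 and f(n) < f(x)]
        if not better:
            return x
        x = better[0]

def ils(x0, strength, iterations, seed=0):
    """Iterated Local Search with a tunable kick size."""
    rng = random.Random(seed)
    x = local_search(x0)
    for _ in range(iterations):
        y = min(100, max(0, x + rng.randint(-strength, strength)))  # kick
        y = local_search(y)
        if f(y) <= f(x):  # accept if no worse
            x = y
    return x

weak = ils(0, strength=3, iterations=200)    # too small to leave the basin at 0
strong = ils(0, strength=10, iterations=200) # large enough to hop between basins
```

Each basin here is 7 wide, so a kick of strength 3 always falls back to the incumbent optimum, while strength 10 can land in a neighbouring basin; this mirrors the paper's point that a properly selected strength helps escape clusters of local optima.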
Communities of Local Optima as Funnels in Fitness Landscapes
2016
We conduct an analysis of local optima networks extracted from fitness landscapes of the Kauffman NK model under iterated local search. Applying the Markov Cluster Algorithm for community detection to the local optima networks, we find that the landscapes consist of multiple clusters. This result complements recent findings in the literature that landscapes often decompose into multiple funnels, which increases their difficulty for iterated local search. Our results suggest that the number of clusters, as well as the size of the cluster in which the global optimum is located, are correlated with the search difficulty of landscapes. We conclude that clusters found by community detection in local…
Multi-start methods for combinatorial optimization
2013
Multi-start methods strategically sample the solution space of an optimization problem. The most successful of these methods have two phases that are alternated for a certain number of global iterations. The first phase generates a solution and the second seeks to improve the outcome. Each global iteration produces a solution that is typically a local optimum, and the best overall solution is the output of the algorithm. The interaction between the two phases creates a balance between search diversification (structural variation) and search intensification (improvement), to yield an effective means for generating high-quality solutions. This survey briefly sketches historical devel…
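The two-phase scheme described above can be sketched in a few lines. The objective, generator, and hill climber below are toy stand-ins, not taken from the survey:

```python
import random

def multi_start(objective, generate, improve, iterations, seed=0):
    """Two-phase multi-start: each global iteration generates a solution
    (diversification) and improves it with local search (intensification);
    the best local optimum found overall is returned.
    """
    rng = random.Random(seed)
    best, best_val = None, float("inf")
    for _ in range(iterations):
        x = generate(rng)            # phase 1: generate a solution
        x = improve(x, objective)    # phase 2: improve it
        val = objective(x)
        if val < best_val:
            best, best_val = x, val
    return best, best_val

# Toy 1-D multimodal objective on integers 0..100 (illustrative only):
# local optima at multiples of 7, tie-broken by distance to 10.
def f(x):
    return (x % 7) + abs(x - 10) / 100.0

def gen(rng):
    return rng.randint(0, 100)

def hill_climb(x, obj):
    # First-improvement local search over +/-1 neighbours.
    while True:
        moves = [n for n in (x - 1, x + 1) if 0 <= n <= 100 and obj(n) < obj(x)]
        if not moves:
            return x
        x = moves[0]

best, val = multi_start(f, gen, hill_climb, iterations=20)
```

Every global iteration ends in one of the landscape's local optima; restarting from fresh random points is what gives the method its chance of sampling the basin of the global optimum.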
Memetic Algorithms in Engineering and Design
2012
When dealing with real-world applications, one often faces non-linear and non-differentiable optimization problems which do not allow the employment of exact methods. In addition, as highlighted in [104], popular local search methods (e.g. Hooke–Jeeves, Nelder–Mead and Rosenbrock) can be ill-suited when the real-world problem is characterized by a complex and highly multi-modal fitness landscape, since they tend to converge to local optima. In these situations, population-based meta-heuristics can be a reasonable choice, since they have good potential for detecting high-quality solutions. For these reasons, meta-heuristics such as Genetic Algorithms (GAs), Evolution Strategy (ES), Particle …
Gradient-based shape optimisation of ultra-wideband antennas parameterised using splines
2010
A methodology enabling the gradient-based optimisation of antennas parameterised using B-splines is presented. Use of the spline parametrisation allows us to obtain versatile new shapes, while the geometry can be represented with a small set of design variables. Moreover, good control over admissible geometries is retained. Advantages of gradient-based optimisation methods are quick convergence and the guarantee that the obtained design is a local optimum. The focus of this study is to present techniques that enable the computation of exact gradients of the discrete problem, even though the complexity of the geometries does not permit establishing analytical expressions for the…